Entropy Estimate for Maps on Forests

Authors

M. Sabbaghan

Abstract

A 1993 result of J. Llibre and M. Misiurewicz (Theorem A [5]) states that if a continuous map f of a graph into itself has an s-horseshoe, then the topological entropy of f is greater than or equal to log s, that is, h(f) ≥ log s. Also, a 1980 result of L.S. Block, J. Guckenheimer, M. Misiurewicz and L.S. Young (Lemma 1.5 [3]) states that if G is an A-graph of f, then h(G) ≤ h(f). In this paper we generalize Theorem A and Lemma 1.5 to continuous functions on forests. Let F be a forest and f : F → F a continuous function. By using the adjacency matrix of a graph, we give a lower bound for the topological entropy of f.
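For orientation, here is a minimal Python sketch (not from the paper) of the familiar spectral-radius bound behind Lemma 1.5: the entropy of an A-graph equals the logarithm of the spectral radius of its adjacency matrix, so h(f) is at least that value. The function name and the example matrix are ours for illustration; the paper's construction of the adjacency matrix for a map on a forest is not reproduced here.

    import numpy as np

    def entropy_lower_bound(adjacency):
        # adjacency: non-negative matrix of an A-graph (transition graph) for f.
        # The entropy of such a graph is log of the spectral radius of its
        # adjacency matrix; by Lemma 1.5 this bounds h(f) from below.
        A = np.asarray(adjacency, dtype=float)
        spectral_radius = max(abs(np.linalg.eigvals(A)))
        if spectral_radius <= 1.0:
            return 0.0  # trivial bound; topological entropy is non-negative
        return float(np.log(spectral_radius))

    # An s-horseshoe with s = 3 gives the full 3x3 transition matrix,
    # so the bound is log 3, in line with Theorem A.
    print(entropy_lower_bound([[1, 1, 1], [1, 1, 1], [1, 1, 1]]))  # ~1.0986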


Similar Articles


Entropy Estimate For High Dimensional Monotonic Functions

We establish upper and lower bounds for the metric entropy and bracketing entropy of the class of d-dimensional bounded monotonic functions under L_p norms. It is interesting to see that both the metric entropy and bracketing entropy have different behaviors for p < d/(d − 1) and p > d/(d − 1). We apply the new bounds for bracketing entropy to establish a global rate of convergence of the MLE of ...


Do Hebbian synapses estimate entropy?

Hebbian learning is one of the mainstays of biologically inspired neural processing. Hebb's rule is biologically plausible, and it has been extensively utilized in both computational neuroscience and unsupervised training of neural systems. In these fields, Hebbian learning became synonymous with correlation learning. But it is known that correlation is a second order statistic of the data, s...
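As a loose illustration of the "correlation learning" point above, a plain Hebbian update multiplies pre- and post-synaptic activity, so its average weight change tracks second-order statistics of the input. The sketch below is generic Python with illustrative names (hebbian_step, eta) that are not taken from the cited paper.

    import numpy as np

    def hebbian_step(W, x, eta=0.01):
        # Plain Hebb rule: the weight change is proportional to the product of
        # post-synaptic activity (y) and pre-synaptic activity (x), so on
        # average the update follows the input correlation matrix.
        y = W @ x
        return W + eta * np.outer(y, x)

    rng = np.random.default_rng(0)
    W = rng.normal(scale=0.1, size=(2, 3))
    for x in rng.normal(size=(100, 3)):
        W = hebbian_step(W, x)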


Pre-image Entropy for Maps on Noncompact Topological Spaces

We propose a new definition of pre-image entropy for continuous maps on noncompact topological spaces, investigate fundamental properties of the new pre-image entropy, and compare the new pre-image entropy with the existing ones. The defined pre-image entropy generalizes that of Cheng and Newhouse. Yet it retains various basic properties of Cheng and Newhouse's pre-image entropy, for example, the ...




Journal: Journal of Sciences, Islamic Republic of Iran

Publisher: University of Tehran

ISSN: 1016-1104

Volume 21, Issue 1 (2010)

